    Reducing bias in auditory duration reproduction by integrating the reproduced signal

    Duration estimation is known to be far from veridical and to differ between sensory estimates and motor reproduction. To investigate how these differential estimates are integrated for estimating or reproducing a duration, and to examine sensorimotor biases in duration comparison and reproduction tasks, we compared estimation biases and variances among three different duration estimation tasks: perceptual comparison, motor reproduction, and auditory reproduction (i.e., a combined perceptual-motor task). We found consistent overestimation in both the motor and the perceptual-motor auditory reproduction tasks, and the least overestimation in the comparison task. More interestingly, compared to pure motor reproduction, the overestimation bias was reduced in the auditory reproduction task, owing to the additional reproduced auditory signal. We further manipulated the signal-to-noise ratio (SNR) of the feedback/comparison tones to examine the changes in estimation biases and variances. Treating perceptual and motor biases as two independent components, we applied a reliability-based model, which successfully predicted the biases in auditory reproduction. Our findings thus provide behavioral evidence of how the brain combines motor and perceptual information to reduce duration estimation biases and improve estimation reliability.
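
    The "reliability-based model" named in this abstract appears to be a reliability-weighted (precision-weighted) cue-combination rule. The sketch below shows that general form; the function name, example numbers, and exact parameterization are illustrative assumptions, not the paper's fitted model.

```python
# Reliability-weighted combination of a perceptual and a motor duration
# estimate: each cue is weighted by its inverse variance, so the combined
# estimate is pulled toward the more reliable cue and has lower variance
# than either cue alone. All numbers below are hypothetical.
def combine(est_perceptual, var_perceptual, est_motor, var_motor):
    w_p = (1 / var_perceptual) / (1 / var_perceptual + 1 / var_motor)
    combined = w_p * est_perceptual + (1 - w_p) * est_motor
    combined_var = 1 / (1 / var_perceptual + 1 / var_motor)
    return combined, combined_var

# A less biased perceptual estimate pulls an overestimating motor
# reproduction back toward the 1 s target duration.
print(combine(est_perceptual=1.05, var_perceptual=0.02,
              est_motor=1.30, var_motor=0.04))
```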

    Distortions of Subjective Time Perception Within and Across Senses

    Background: The ability to estimate the passage of time is of fundamental importance for perceptual and cognitive processes. One experience of time is the perception of duration, which is not isomorphic to physical duration and can be distorted by a number of factors. Yet, the critical features generating these perceptual shifts in subjective duration are not understood. Methodology/Findings: We used prospective duration judgments within and across sensory modalities to examine the effect of stimulus predictability and feature change on the perception of duration. First, we found robust distortions of perceived duration in auditory, visual, and auditory-visual presentations despite the predictability of the feature changes in the stimuli. For example, a looming disc embedded in a series of steady discs led to time dilation, whereas a steady disc embedded in a series of looming discs led to time compression. Second, we addressed whether visual (auditory) inputs could alter the perceived duration of auditory (visual) inputs. When participants were presented with incongruent audio-visual stimuli, the perceived duration of auditory events could be shortened or lengthened by the presence of conflicting visual information; however, the perceived duration of visual events was seldom distorted by the presence of auditory information, and visual events were never perceived as shorter than their actual durations. Conclusions/Significance: These results support the existence of multisensory interactions in the perception of duration and, importantly, suggest that vision can modify auditory temporal perception in a pure timing task. Insofar as distortions in subjective duration cannot be accounted for by the unpredictability of an auditory, visual, or auditory-visual event, we propose that it is the intrinsic features of the stimulus that critically affect subjective time distortions.

    Pupil response hazard rates predict perceived gaze durations

    We investigated the mechanisms for evaluating perceived gaze-shift duration. Timing relies on the accumulation of endogenous physiological signals. Here we focused on arousal, measured through pupil dilation, as a candidate timing signal. Participants timed gaze-shifts performed by face stimuli in a Standard/Probe comparison task. Pupil responses were binned according to “Longer/Shorter” judgements in trials where Standard and Probe were identical. This ensured that pupil responses reflected endogenous arousal fluctuations as opposed to differences in stimulus content. We found that pupil hazard rates predicted the classification of sub-second intervals (steeper dilation = “Longer” classifications). This shows that the accumulation of endogenous arousal signals informs gaze-shift timing judgements. We also found that participants relied exclusively on the second stimulus to perform the classification, providing insights into timing strategies under conditions of maximum uncertainty. We observed no dissociation in pupil responses when timing equivalent neutral spatial displacements, indicating that a stimulus-dependent timer exploits arousal to time gaze-shifts.
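
    A hazard rate, in its general form, is the instantaneous probability of an event given that it has not yet occurred, h(t) = f(t) / (1 - F(t)). The snippet below estimates such a rate from a sample of latencies; it is a generic illustration of the statistic named in the abstract, with made-up data, not the authors' pupillometry pipeline.

```python
# Generic hazard-rate estimate h(t) = f(t) / (1 - F(t)) from sampled event
# times. Data and bin choices are hypothetical; this only illustrates the
# concept, not the paper's analysis.
import numpy as np

def hazard_rate(samples, bins):
    counts, edges = np.histogram(samples, bins=bins)
    widths = np.diff(edges)
    pdf = counts / (counts.sum() * widths)                  # empirical f(t)
    cdf_before = np.concatenate(([0.0], np.cumsum(counts)[:-1])) / counts.sum()
    survival = 1.0 - cdf_before                             # S(t) entering each bin
    return pdf / np.maximum(survival, 1e-12)

rng = np.random.default_rng(0)
latencies = rng.gamma(shape=2.0, scale=0.3, size=1000)      # made-up latencies (s)
print(np.round(hazard_rate(latencies, bins=np.linspace(0, 2.0, 21)), 2))
```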

    Do Changes in the Pace of Events Affect One-Off Judgments of Duration?

    Five experiments examined whether changes in the pace of external events influence people’s judgments of duration. In Experiments 1a–1c, participants heard pieces of music whose tempo accelerated, decelerated, or remained constant. In Experiment 2, participants completed a visuo-motor task in which the rate of stimulus presentation accelerated, decelerated, or remained constant. In Experiment 3, participants completed a reading task in which facts appeared on-screen at accelerating, decelerating, or constant rates. In all experiments, the physical duration of the to-be-judged interval was the same across conditions. We found no significant effects of temporal structure on duration judgments in any of the experiments, either when participants knew that a time estimate would be required (prospective judgments) or when they did not (retrospective judgments). These results provide a starting point for the investigation of how temporal structure affects one-off judgments of duration, like those typically made in natural settings.

    Activity in perceptual classification networks as a basis for human subjective time perception

    Despite being a fundamental dimension of experience, how the human brain generates the perception of time remains unknown. Here, we provide a novel explanation for how human time perception might be accomplished, based on non-temporal perceptual classification processes. To demonstrate this proposal, we build an artificial neural system centred on a feed-forward image classification network, functionally similar to human visual processing. In this system, input videos of natural scenes drive changes in network activation, and the accumulation of salient changes in activation is used to estimate duration. Estimates produced by this system match human reports made about the same videos, replicating key qualitative biases, including the difference between scenes of walking around a busy city and sitting in a cafe or office. Our approach provides a working model of duration perception from stimulus to estimation and presents a new direction for examining the foundations of this central aspect of human experience.
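
    A minimal sketch of the mechanism described above: changes in the activations of a classification network are accumulated when they exceed a salience threshold, and the accumulated count is mapped to a duration estimate. The feature extractor (a fixed random projection), threshold, and calibration below are stand-ins, not the published network or parameters.

```python
# Toy version of duration estimation from accumulated salient activation
# changes. A fixed random projection stands in for an image-classification
# network layer; threshold and seconds_per_event are illustrative values.
import numpy as np

rng = np.random.default_rng(1)
W = rng.standard_normal((64, 128))        # stand-in "network layer" weights

def extract_features(frame):
    return np.tanh(frame @ W)             # stand-in layer activations

def estimate_duration(frames, threshold=16.0, seconds_per_event=0.5):
    accumulated = 0
    prev = extract_features(frames[0])
    for frame in frames[1:]:
        current = extract_features(frame)
        if np.linalg.norm(current - prev) > threshold:   # count salient changes only
            accumulated += 1
        prev = current
    return accumulated * seconds_per_event               # simple linear read-out

video = rng.standard_normal((120, 64))    # 120 hypothetical frame vectors
print(estimate_duration(video))
```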

    Cross-Modal Distortion of Time Perception: Demerging the Effects of Observed and Performed Motion

    Temporal information is often contained in multi-sensory stimuli, but it is currently unknown how the brain combines, for example, visual and auditory cues into a coherent percept of time. Existing studies of cross-modal time perception mainly support the “modality appropriateness hypothesis”, i.e., the domination of auditory temporal cues over visual ones because of the higher precision of audition for time perception. However, these studies suffer from methodological problems and conflicting results. We introduce a novel experimental paradigm to examine cross-modal time perception by combining an auditory time perception task with a visually guided motor task, requiring participants to follow an elliptical movement on a screen with a robotic manipulandum. We find that subjective duration is distorted according to the speed of the visually observed movement: the faster the visual motion, the longer the perceived duration. In contrast, the actual execution of the arm movement does not contribute to this effect, but it impairs discrimination performance through dual-task interference. We also show that additional training on the motor task attenuates this interference but does not affect the distortion of subjective duration. The study demonstrates a direct influence of visual motion on auditory temporal representations that is independent of attentional modulation. At the same time, it provides causal support for the notion that time perception and continuous motor timing rely on separate mechanisms, a proposal that was formerly supported by correlational evidence only. The results constitute a counterexample to the modality appropriateness hypothesis and are best explained by Bayesian integration of modality-specific temporal information into a centralized “temporal hub”.

    The Effect of Predictability on Subjective Duration

    Events can sometimes appear longer or shorter in duration than other events of equal length. For example, in a repeated presentation of auditory or visual stimuli, an unexpected object of equivalent duration appears to last longer. Such illusions of duration distortion raise an important question about time representation: when durations dilate or contract, does time in general slow down or speed up during that moment? In other words, what entailments do duration distortions have with respect to other timing judgments? We show here that when a sound or visual flicker is presented in conjunction with an unexpected visual stimulus, neither the pitch of the sound nor the frequency of the flicker is affected by the apparent duration dilation. This demonstrates that subjective time in general is not slowed; instead, duration judgments can be manipulated with no concurrent impact on other temporal judgments. Like spatial vision, time perception appears to be underpinned by a collaboration of separate neural mechanisms that usually work in concert but are separable. We further show that the duration dilation of an unexpected stimulus is not enhanced by increasing its saliency, suggesting that the effect is more closely related to prediction violation than to enhanced attention. Finally, duration distortions induced by violations of progressive number sequences implicate high-level predictability, suggesting the involvement of areas beyond primary visual cortex. We suggest that duration distortions can be understood in terms of repetition suppression, in which neural responses to repeated stimuli are diminished.

    Crossmodal duration perception involves perceptual grouping, temporal ventriloquism, and variable internal clock rates

    Here, we investigate how audiovisual context affects perceived event duration in experiments in which observers reported which of two stimuli they perceived as longer. Target events were visual and/or auditory and could be accompanied by nontargets in the other modality. Our results demonstrate that the temporal information conveyed by irrelevant sounds is automatically used when the brain estimates visual durations, but that irrelevant visual information does not affect perceived auditory duration (Experiment 1). We further show that auditory influences on subjective visual durations occur only when the temporal characteristics of the stimuli promote perceptual grouping (Experiments 1 and 2). Placed in the context of the scalar expectancy theory of time perception, our third and fourth experiments imply that audiovisual context can lead both to changes in the rate of an internal clock and to temporal ventriloquism-like effects on perceived onsets and offsets. Finally, intramodal grouping of auditory stimuli diminished any crossmodal effects, suggesting a strong preference for intramodal over crossmodal perceptual grouping (Experiment 5).
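
    In scalar expectancy theory, an interval is timed by counting pulses from an internal pacemaker, so a faster clock yields a longer judged duration for the same physical interval. The toy pacemaker-accumulator below illustrates that rate effect; the baseline rate, Poisson pulse model, and numbers are assumptions for illustration only, not the paper's model.

```python
# Toy pacemaker-accumulator: pulses emitted at a given clock rate are counted
# during a physical interval and read out at an assumed baseline rate, so
# speeding up the clock dilates the judged duration. All values are illustrative.
import numpy as np

BASELINE_RATE_HZ = 50.0                   # assumed baseline pacemaker rate

def judged_duration(physical_s, clock_rate_hz, rng):
    pulses = rng.poisson(clock_rate_hz * physical_s)   # accumulate clock pulses
    return pulses / BASELINE_RATE_HZ                   # read out in "seconds"

rng = np.random.default_rng(2)
print(judged_duration(1.0, BASELINE_RATE_HZ, rng))         # ~1.0 s judged
print(judged_duration(1.0, BASELINE_RATE_HZ * 1.2, rng))   # ~1.2 s: sped-up clock
```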

    Dissociation between the Activity of the Right Middle Frontal Gyrus and the Middle Temporal Gyrus in Processing Semantic Priming

    The aim of this event-related functional magnetic resonance imaging (fMRI) study was to test whether the right middle frontal gyrus (MFG) and middle temporal gyrus (MTG) would show differential sensitivity to the effect of prime-target association strength on repetition priming. In the experimental condition (RP), the target occurred after repetitive presentation of the prime within an oddball design. In the control condition (CTR), the target followed a single presentation of the prime, with the same target probability as in RP. To manipulate semantic overlap between the prime and the target, both conditions (RP and CTR) employed either the onomatopoeia “oink” as the prime and the referent “pig” as the target (OP) or vice versa (PO), since semantic overlap was previously shown to be greater in OP. The results showed that the left MTG was sensitive to release of adaptation, while both the right MTG and MFG were sensitive to the extraction of sequence regularity and its verification. However, dissociated activity between OP and PO in RP was revealed only in the right MFG. Specifically, the target “pig” (OP) and the physically equivalent target in CTR elicited comparable deactivations, whereas the target “oink” (PO) elicited a less inhibited response in RP than in CTR. This interaction in the right MFG is explained by integrating these effects into a model of competition between perceptual and conceptual effects in priming.